
    F1000 recommendations as a new data source for research evaluation: A comparison with citations

    F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective it could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications receive on average 1.30 recommendations, and over 90% of the recommendations are given within half a year after a publication has appeared. There turns out to be a clear correlation between F1000 recommendations and citations. However, the correlation is relatively weak, at least weaker than the correlation between journal impact and citations. More research is needed to identify the main reasons for the differences between recommendations and citations in assessing the impact of publications.
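    A minimal sketch, in Python, of the kind of comparison described above. It assumes a hypothetical merged file "papers.csv" that already links each Web of Science publication to its citation count, its number of F1000 recommendations and the impact of its journal; the file and column names are illustrative, not the authors' actual data.

        import pandas as pd
        from scipy.stats import spearmanr

        # Hypothetical merged dataset: one row per publication.
        papers = pd.read_csv("papers.csv")  # columns: citations, f1000_recs, journal_impact

        # Share of publications with at least one F1000 recommendation (~2% in the study).
        share_recommended = (papers["f1000_recs"] > 0).mean()

        # Rank correlation of recommendations vs. citations, set against the
        # correlation of journal impact vs. citations for comparison.
        rho_recs, _ = spearmanr(papers["f1000_recs"], papers["citations"])
        rho_jif, _ = spearmanr(papers["journal_impact"], papers["citations"])

        print(f"recommended share: {share_recommended:.2%}")
        print(f"rho(recommendations, citations) = {rho_recs:.2f}")
        print(f"rho(journal impact, citations)  = {rho_jif:.2f}")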

    Hacia los estudios de medios sociales de la ciencia: las métricas de los medios sociales, presente y futuro [Towards the social media studies of science: social media metrics, present and future]

    In recent years a new research topic has rapidly emerged in the field of scientometrics. This topic, popularly known as altmetrics, was first proposed in the Altmetrics manifesto (Priem et al., 2010). Since its proposal, altmetrics has proven difficult to define (Haustein, Bowman & Costas, 2016), even being described as “a good idea, but a bad name” (Rousseau & Ye, 2013). Altmetrics has usually referred to new metrics around scholarly objects captured through events recorded on online social media platforms (Haustein et al., 2016). However, the large diversity of sources and metrics that fall within the realm of altmetrics has made it hard to reach a consensus on what can be considered altmetrics (Haustein et al., 2016).

    Towards the social media studies of science: social media metrics, present and future

    In this paper we aim to provide a general reflection on the present and future of social media metrics (or altmetrics) and how they could evolve into a new discipline focused on the study of the relationships and interactions between science and social media, in what could be seen as the social media studies of science. Comment: Spanish version: http://revistas.bnjm.cu/index.php/anales/article/view/417

    What makes papers visible on social media? An analysis of various document characteristics

    In this study we investigate the relationship between different document characteristics and the number of Mendeley readership counts, tweets, Facebook posts, and mentions in blogs and mainstream media for 1.3 million papers published in journals covered by the Web of Science (WoS). The study aims to demonstrate how the factors affecting various social media-based indicators differ from those influencing citations, and which document types are more popular across the different platforms. Our results highlight the heterogeneous nature of altmetrics, which encompasses different types of uses and user groups engaging with research on social media. Comment: Presented at the 21st International Conference on Science & Technology Indicators (STI), 13-16 September 2016, Valencia, Spain
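    A rough sketch of this kind of per-platform breakdown by document type, assuming a hypothetical file "wos_altmetrics.csv" with one row per WoS paper; the column names for document type and event counts are assumptions for illustration, not the study's actual variables.

        import pandas as pd

        df = pd.read_csv("wos_altmetrics.csv")
        platforms = ["mendeley_readers", "tweets", "facebook_posts",
                     "blog_mentions", "news_mentions"]

        # Coverage (share of papers with at least one event) and mean counts,
        # broken down by document type, for each platform.
        coverage = df.groupby("doc_type")[platforms].apply(lambda g: (g > 0).mean())
        means = df.groupby("doc_type")[platforms].mean()

        print(coverage.round(3))
        print(means.round(2))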

    DataCite as a novel bibliometric source: Coverage, strengths and limitations

    This paper explores the characteristics of DataCite to determine its possibilities and potential as a new bibliometric data source for analyzing the scholarly production of open data. Open science and the increasing data sharing requirements from governments, funding bodies, institutions and scientific journals have led to a pressing demand for the development of data metrics. As a first step towards reliable data metrics, we need to better understand the limitations and caveats of the information provided by sources of open data. In this paper, we critically examine records downloaded from DataCite's OAI API and offer a series of recommendations regarding the use of this source for bibliometric analyses of open data. We highlight issues related to metadata incompleteness, lack of standardization, and ambiguous definitions of several fields. Despite these limitations, we emphasize DataCite's value and potential to become one of the main sources for data metrics development. Comment: Paper accepted for publication in the Journal of Informetrics
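    A minimal sketch of harvesting records and checking metadata completeness, using the standard OAI-PMH protocol verbs and Dublin Core namespaces. The endpoint URL and the chosen fields are assumptions for illustration, not the paper's exact procedure; verify the endpoint before relying on it.

        import requests
        import xml.etree.ElementTree as ET

        OAI_BASE = "https://oai.datacite.org/oai"  # assumed DataCite OAI-PMH endpoint
        NS = {
            "oai": "http://www.openarchives.org/OAI/2.0/",
            "dc": "http://purl.org/dc/elements/1.1/",
        }

        # Fetch the first page of records in Dublin Core format.
        resp = requests.get(OAI_BASE, params={"verb": "ListRecords",
                                              "metadataPrefix": "oai_dc"})
        root = ET.fromstring(resp.content)
        records = root.findall(".//oai:record", NS)

        # Count records missing basic descriptive fields, as a simple proxy
        # for metadata incompleteness.
        fields = ["title", "creator", "publisher", "date", "subject"]
        missing = {f: sum(1 for r in records if r.find(f".//dc:{f}", NS) is None)
                   for f in fields}

        print(f"{len(records)} records on first page")
        for field, count in missing.items():
            print(f"missing {field}: {count}")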